Robust subspace computation using L1 norm

Authors

  • Qifa Ke
  • Takeo Kanade
Abstract

Linear subspaces have many important applications in computer vision, such as structure from motion, motion estimation, layer extraction, object recognition, and object tracking. The Singular Value Decomposition (SVD) is the standard technique for computing a subspace from input data. The SVD, however, is sensitive to outliers because it minimizes an L2-norm metric, and it cannot handle missing data either. In this paper, we propose computing the subspace under an L1-norm metric instead. We show that it is robust to outliers and can handle missing data. We present two algorithms to optimize the L1-norm metric: the weighted median algorithm and the quadratic programming algorithm. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of Carnegie Mellon University or the U.S. Government.
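The abstract names two optimizers for the L1 metric without detail. As a rough illustration of the weighted-median idea only, the sketch below fits a rank-1 subspace to a fully observed data matrix by alternating exact weighted-median updates; it is not the authors' implementation (the function names, the deterministic initialization, and the fixed iteration count are assumptions of this sketch, and missing data is not handled here).

```python
import numpy as np

def weighted_median(values, weights):
    """Return a value v minimizing sum_i weights[i] * |values[i] - v|."""
    order = np.argsort(values)
    values, weights = values[order], weights[order]
    cum = np.cumsum(weights)
    # smallest index at which the cumulative weight reaches half the total
    idx = np.searchsorted(cum, 0.5 * cum[-1])
    return values[idx]

def l1_rank1(M, n_iter=50):
    """Rank-1 L1 fit: minimize sum_ij |M_ij - u_i * v_j| by alternating
    weighted-median updates (illustrative sketch, fixed iteration count)."""
    d, n = M.shape
    u = np.ones(d)          # simple deterministic initialization (an assumption)
    v = np.zeros(n)
    for _ in range(n_iter):
        # fix u, update v_j: weighted median of M_ij / u_i with weights |u_i|
        mask_u = np.abs(u) > 1e-12
        if not mask_u.any():
            break
        for j in range(n):
            v[j] = weighted_median(M[mask_u, j] / u[mask_u], np.abs(u[mask_u]))
        # fix v, update u_i symmetrically
        mask_v = np.abs(v) > 1e-12
        if not mask_v.any():
            break
        for i in range(d):
            u[i] = weighted_median(M[i, mask_v] / v[mask_v], np.abs(v[mask_v]))
        # rescale to keep the factorization well conditioned
        norm = np.linalg.norm(u)
        if norm > 0:
            u, v = u / norm, v * norm
    return u, v

# Tiny usage example: a rank-1 matrix corrupted by a single gross outlier.
rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(6), rng.standard_normal(10))
M[2, 4] += 25.0
u, v = l1_rank1(M)
```

Because each weighted-median update solves its one-dimensional subproblem exactly, the L1 objective ∑_{ij} |M_ij − u_i v_j| is non-increasing across iterations.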


Similar references

Robust blind methods using $\ell_p$ quasi norms

It was shown in a previous work that some blind methods can be made robust to channel order overmodeling by using the l1 or lp quasi-norms. However, no theoretical argument has been provided to support this statement. In this work, we study the robustness of subspace blind based methods using l1 or lp quasi-norms. For the l1 norm, we provide the sufficient and necessary condition that the chann...


Learning Robust Graph Regularisation for Subspace Clustering

Various subspace clustering methods have benefited from introducing a graph regularisation term in their objective functions. In this work, we identify two critical limitations of the graph regularisation term employed in existing subspace clustering models and provide solutions for both of them. First, the squared l2-norm used in the existing term is replaced by a l1-norm term to make the regu...


Some Options for L1-subspace Signal Processing

We describe ways to define and calculate L1-norm signal subspaces which are less sensitive to outlying data than L2-calculated subspaces. We focus on the computation of the L1 maximum-projection principal component of a data matrix containing N signal samples of dimension D and conclude that the general problem is formally NP-hard in asymptotically large N , D. We prove, however, that the case ...


L1-norm-based (2D)PCA

Traditional bidirectional two-dimension (2D) principal component analysis ((2D)PCA-L2) is sensitive to outliers because its objective function is the least squares criterion based on L2-norm. This paper proposes a simple but effective L1-norm-based bidirectional 2D principal component analysis ((2D)PCA-L1), which jointly takes advantage of the merits of bidirectional 2D subspace learning and L1...


Optimal Algorithms for L1-subspace Signal Processing

Abstract We describe ways to define and calculate L1-norm signal subspaces which are less sensitive to outlying data than L2-calculated subspaces. We start with the computation of the L1 maximum-projection principal component of a data matrix containing N signal samples of dimension D. We show that while the general problem is formally NP-hard in asymptotically large N , D, the case of engineer...

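The two preceding entries concern the L1 maximum-projection principal component, i.e. the unit vector w maximizing ∑_n |wᵀx_n| over the columns x_n of a D×N data matrix X. As a hedged illustration of why the general problem is combinatorial, the sketch below computes it exactly by brute force over binary sign vectors (exponential in N, so viable only for very small N; the function name is an assumption of this sketch):

```python
import itertools
import numpy as np

def l1_pc_exhaustive(X):
    """Exact L1 maximum-projection principal component of a D x N matrix X,
    found by brute force over all 2^N sign vectors (small N only)."""
    D, N = X.shape
    best_val, best_w = -1.0, None
    for signs in itertools.product((-1.0, 1.0), repeat=N):
        z = X @ np.asarray(signs)        # X b for this sign vector b
        val = float(np.linalg.norm(z))   # ||X b||_2
        if val > best_val:
            best_val = val
            best_w = z / val if val > 0 else z  # guard against an all-zero X
    return best_w, best_val              # best_val = sum_n |best_w^T x_n|

# Tiny usage example: D = 3, N = 8 gives 2^8 = 256 candidate sign vectors.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 8))
w, val = l1_pc_exhaustive(X)
print(w, val)
```

The search is justified by the identity max_{‖w‖₂=1} ∑_n |wᵀx_n| = max_{b∈{±1}ᴺ} ‖Xb‖₂, with the optimal w proportional to Xb* for the maximizing sign vector b*.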


Journal:

Volume   Issue

Pages   -

Publication year: 2003